Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression
Authors
Abstract
Kernel-based machine learning techniques have been widely used to tackle problems of function approximation and regression estimation. The relevance vector machine (RVM) offers state-of-the-art performance in sparse regression. As a popular and capable kernel function in machine learning, the conventional Gaussian kernel uses a single shared kernel width for all basis functions, which implicitly makes a basic assumption: the response is represented below a certain frequency and the noise above that frequency. However, in many cases this assumption does not hold. To overcome this limitation, a novel adaptive spherical Gaussian kernel is utilized for nonlinear regression, and a stagewise optimization algorithm that maximizes the Bayesian evidence within the sparse Bayesian learning framework is proposed for model selection. An extensive empirical study on two artificial datasets and two real-world benchmark datasets shows the effectiveness and flexibility of the model in representing regression problems, with higher levels of sparsity and better performance than the classical RVM. An attractive ability of this approach is that it automatically chooses appropriate kernel widths, locally fitted to the relevance vectors selected from the training dataset, which maintains the right level of smoothing at each scale of the signal.
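To illustrate the idea of per-basis kernel widths in a sparse Bayesian setting, the following Python sketch builds spherical Gaussian basis functions centred on training points at several candidate widths and prunes them with standard RVM type-II maximum-likelihood (evidence) re-estimation. This is not the authors' stagewise algorithm: the paper's per-RV width optimization is approximated here by an over-complete dictionary of widths, and the function names, the width grid, and the toy data are illustrative assumptions.

import numpy as np

def adaptive_gaussian_design(X, centres, widths):
    # One spherical Gaussian basis column per (centre, width) pair.
    sq_dist = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)   # (N, M)
    cols = [np.exp(-sq_dist / (2.0 * w ** 2)) for w in widths]       # per-width blocks
    return np.hstack(cols)                                           # (N, M * len(widths))

def fit_rvm(Phi, t, n_iter=500, prune_at=1e9):
    # Type-II maximum-likelihood re-estimation of per-weight precisions (alpha)
    # and noise precision (beta); bases with huge alpha are pruned away.
    N, M = Phi.shape
    keep = np.arange(M)                        # indices of surviving basis functions
    alpha = np.ones(M)                         # per-weight precisions
    beta = 1.0 / (0.1 * np.var(t))             # initial noise precision (heuristic)
    for _ in range(n_iter):
        P = Phi[:, keep]
        Sigma = np.linalg.inv(np.diag(alpha) + beta * P.T @ P)   # posterior covariance
        mu = beta * Sigma @ P.T @ t                              # posterior mean
        gamma = 1.0 - alpha * np.diag(Sigma)                     # well-determinedness factors
        alpha = gamma / np.maximum(mu ** 2, 1e-12)               # evidence update for alpha
        beta = (N - gamma.sum()) / np.sum((t - P @ mu) ** 2)     # evidence update for beta
        mask = alpha < prune_at                                  # drop irrelevant bases
        keep, alpha = keep[mask], alpha[mask]
    return keep, mu[mask], beta

# Toy usage: a noisy sinc signal; each retained relevance vector keeps its own width.
rng = np.random.default_rng(0)
X = rng.uniform(-10, 10, size=(120, 1))
t = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(120)
widths = [0.5, 1.0, 2.0, 4.0]                      # candidate kernel widths (assumed grid)
Phi = adaptive_gaussian_design(X, X, widths)
keep, w, beta = fit_rvm(Phi, t)
print(len(keep), "relevance vectors kept out of", Phi.shape[1], "candidates")
print("locally chosen widths:", sorted({widths[j // len(X)] for j in keep}))

Letting the evidence prune an over-complete width dictionary captures the qualitative behaviour described in the abstract (different smoothing at different signal scales), though the paper's stagewise evidence maximization selects each RV's width directly rather than by exhaustive enumeration.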
Similar resources
Utilizing Kernel Adaptive Filters for Speech Enhancement within the ALE Framework
Performance of the linear models, widely used within the framework of adaptive line enhancement (ALE), deteriorates dramatically in the presence of non-Gaussian noises. On the other hand, adaptive implementation of nonlinear models, e.g. the Volterra filters, suffers from the severe problems of large number of parameters and slow convergence. Nonetheless, kernel methods are emerging solutions t...
The evidence framework applied to sparse kernel logistic regression
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on the evidence framework introduced by MacKay. The principal innovation lies in the re-parameterisation of the model such that the usual spherical Gaussian prior over the parameters in the kernel induced feature space also corresponds to a spherical Gaussian prior over t...
Bayesian Approximate Kernel Regression with Variable Selection
Nonlinear kernel regression models are often used in statistics and machine learning due to greater accuracy than linear models. Variable selection for kernel regression models is a challenge partly because, unlike the linear regression setting, there is no clear concept of an effect size for regression coefficients. In this paper, we propose a novel framework that provides an analog of the eff...
Bayesian Generalized Kernel Mixed Models
We propose a fully Bayesian methodology for generalized kernel mixed models (GKMMs), which are extensions of generalized linear mixed models in the feature space induced by a reproducing kernel. We place a mixture of a point-mass distribution and Silverman’s g-prior on the regression vector of a generalized kernel model (GKM). This mixture prior allows a fraction of the components of the regres...
Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations
Sparse approximations to Bayesian inference for nonparametric Gaussian Process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to large datasets. We present a general framework based on the informative vector machine (IVM) (Lawrence et al., 2003) and show how the complete Bayesian task of inference and learning of free hyperpa...
Journal: Expert Syst. Appl.
Volume 36, Issue -
Pages: -
Publication date: 2009